We consider partially-specified optimization problems where the goal is to actively, but efficiently, acquire missing information about the problem in order to solve it. An algorithm designer wishes to solve a linear program (LP), max cᵀx s.t. Ax ≤ b, x ≥ 0, but does not initially know some of the parameters. The algorithm can iteratively choose an unknown parameter and gather information in the form of a noisy sample centered at the parameter's (unknown) value. The goal is to find an approximately feasible and optimal solution to the underlying LP with high probability while drawing a small number of samples. We focus on two cases. (1) When the parameters b of the constraints are initially unknown, we propose an efficient algorithm combining techniques from the ellipsoid method for LP and confidence-bound approaches from bandit algorithms. The algorithm adaptively gathers information about constraints only as needed in order to make progress. We give sample complexity bounds for the algorithm and demonstrate its improvement over a naive approach via simulation. (2) When the parameters c of the objective are initially unknown, we take an information-theoretic approach and give roughly matching upper and lower sample complexity bounds, with an (inefficient) successive-elimination algorithm.
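To make the first setting concrete, the sketch below shows a simplified confidence-bound loop for an LP whose right-hand side b is only observable through noisy samples. It is not the paper's ellipsoid-based algorithm: the sub-Gaussian noise model, the confidence radius, the stopping rule, and the helper sample_b are illustrative assumptions. The loop repeatedly solves the LP against a pessimistic (lower-confidence) estimate of b and re-samples only the constraints that are near-binding at the current solution.

```python
# Minimal illustrative sketch (assumptions noted above), not the paper's method:
# maximize c^T x s.t. Ax <= b, x >= 0 when each b_i is seen only through
# noisy samples. Maintain a lower confidence bound on b_i and adaptively
# re-sample constraints that are tight at the current candidate solution.

import numpy as np
from scipy.optimize import linprog

def solve_lp_unknown_b(c, A, sample_b, eps=0.05, delta=0.05, max_rounds=200):
    """c, A: known LP data. sample_b(i): one noisy observation of b_i
    (assumed 1-sub-Gaussian). Returns an approximately feasible/optimal x."""
    m = A.shape[0]
    counts = np.ones(m)                       # one initial sample per constraint
    means = np.array([sample_b(i) for i in range(m)])

    for _ in range(max_rounds):
        # Confidence radius from a standard sub-Gaussian tail bound (assumption).
        rad = np.sqrt(2.0 * np.log(2.0 * m * max_rounds / delta) / counts)
        b_lcb = means - rad                   # pessimistic, feasibility-leaning estimate

        # Solve the LP against the pessimistic right-hand side (x >= 0 is the default bound).
        res = linprog(-np.asarray(c), A_ub=A, b_ub=b_lcb)
        if not res.success:
            # Pessimistic LP infeasible: sample the least-explored constraint.
            i = int(np.argmin(counts))
        else:
            x = res.x
            slack = b_lcb - A @ x
            binding = np.where(slack <= eps)[0]
            # Stop once every (near-)binding constraint is known to eps accuracy.
            if binding.size == 0 or np.all(rad[binding] <= eps):
                return x
            # Otherwise, sample the binding constraint we are most uncertain about.
            i = int(binding[np.argmax(rad[binding])])

        # Draw one more sample of b_i and update its running mean.
        counts[i] += 1
        means[i] += (sample_b(i) - means[i]) / counts[i]

    return res.x if res.success else None
```

A naive baseline would sample every constraint to eps accuracy up front; the adaptive loop spends samples only on constraints that actually bind at the optimum, which is the kind of improvement over a naive approach the abstract refers to.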